On the Pseudo Multilayer Learning of Backpropagation
Authors
Abstract
Rosenblatt's convergence theorem for the simple perceptron sparked great excitement about iterative weight-modifying neural networks. That convergence, however, holds only for the class of linearly separable functions, which is vanishingly small compared with the class of arbitrary functions. With multilayer networks of nonlinear units it is possible, though not guaranteed, to solve arbitrary functions. Backpropagation is a method of training multilayer networks to converge to the solution of arbitrary functions. This paper describes how classification takes place in single-layer and multilayer networks using threshold or sigmoid nodes. It then shows that the current backpropagation method can perform effective learning on only one layer of a network at a time.
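To make the training procedure under discussion concrete, here is a minimal sketch of backpropagation on a one-hidden-layer sigmoid network, trained on XOR (a function no single-layer perceptron can represent). The architecture, learning rate, and iteration count are illustrative assumptions, not details from the paper.

    import numpy as np

    # Minimal backpropagation sketch (illustrative; not the paper's code).
    # One hidden layer of sigmoid units trained on XOR.
    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)  # input -> hidden
    W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)  # hidden -> output
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    eta = 0.5  # assumed learning rate

    for _ in range(20000):
        H = sigmoid(X @ W1 + b1)         # forward pass, hidden layer
        Y = sigmoid(H @ W2 + b2)         # forward pass, output layer
        dY = (Y - T) * Y * (1 - Y)       # output delta: error * sigmoid'
        dH = (dY @ W2.T) * H * (1 - H)   # hidden delta, backpropagated
        W2 -= eta * (H.T @ dY); b2 -= eta * dY.sum(axis=0)
        W1 -= eta * (X.T @ dH); b1 -= eta * dH.sum(axis=0)

    print(np.round(Y.ravel(), 2))  # approaches [0, 1, 1, 0]

Note that both weight layers receive updates on every pass; the paper's thesis is that, despite this, effective learning happens on only one layer at a time.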
Similar references
Speed-up of error backpropagation algorithm with class-selective relevance
Selective attention learning is proposed to improve the speed of the error backpropagation algorithm for fast speaker adaptation. Class-selective relevance, which measures the importance of each hidden node in a multilayer perceptron, is used to update the network's weights selectively, thereby reducing the computational cost of learning.
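As a rough sketch of the selective-update idea (not the cited paper's exact algorithm): score each hidden node, then restrict the backpropagation weight update to the top-ranked nodes. The relevance score below, mean absolute delta per hidden node, is a stand-in assumption; the paper defines a class-selective relevance measure.

    import numpy as np

    # Sketch: apply the backprop update only to the k most "relevant"
    # hidden nodes. The relevance score is an assumed stand-in for the
    # paper's class-selective relevance.
    def selective_backprop_step(W1, W2, X, H, dH, dY, eta=0.1, k=2):
        relevance = np.abs(dH).mean(axis=0)   # one score per hidden node
        active = np.argsort(relevance)[-k:]   # indices of the top-k nodes
        W2[active, :] -= eta * (H[:, active].T @ dY)  # hidden -> output
        W1[:, active] -= eta * (X.T @ dH[:, active])  # input -> hidden
        return W1, W2

Only the rows and columns attached to the selected nodes are touched, which is where the computational saving comes from.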
Discrete All-positive Multilayer Perceptrons for Optical Implementation
All-optical multilayer perceptrons differ in various ways from the ideal neural network model. Examples are the use of non-ideal activation functions which are truncated, asymmetric, and have a non-standard gain; the restriction of the network parameters to non-negative values; and the limited accuracy of the weights. In this paper, a backpropagation-based learning rule is presented that compensates...
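One common way to honor a non-negativity constraint during gradient training, sketched below, is to project the weights back onto the non-negative orthant after each update. Whether the cited rule compensates this way or differently is not stated in the excerpt, so treat this as an assumption.

    import numpy as np

    # Sketch: projected gradient step that keeps all weights non-negative,
    # as an optical implementation requires. This projection scheme is an
    # assumption; the cited paper's compensation may differ.
    def nonnegative_step(W, grad, eta=0.1):
        W = W - eta * grad
        return np.clip(W, 0.0, None)  # project onto W >= 0 elementwise

    W = np.array([[0.2, 0.05], [0.4, 0.0]])
    g = np.array([[1.0, 1.0], [-1.0, 1.0]])
    print(nonnegative_step(W, g))  # entries that would go negative are clamped to 0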
The Backpropagation Algorithm Functions for the Multilayer Perceptron
The attempts to solve linearly inseparable problems have led to different variations in the number of layers of neurons and in the activation functions used. The backpropagation algorithm is the best-known and most widely used supervised learning algorithm. Also called the generalized delta algorithm, because it generalizes the training rule of the Adaline network, it is based on minimizing the difference between the ...
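For reference, the generalized delta rule named in the excerpt is the standard one. In LaTeX notation, with target t_j, output y_j, activation f, and learning rate \eta:

    E = \tfrac{1}{2} \sum_j (t_j - y_j)^2,
    \qquad
    \Delta w_{ij} = -\eta \frac{\partial E}{\partial w_{ij}} = \eta\, \delta_j\, x_i,

where

    \delta_j = (t_j - y_j)\, f'(\mathrm{net}_j) \quad \text{(output units)},
    \qquad
    \delta_j = f'(\mathrm{net}_j) \sum_k \delta_k w_{jk} \quad \text{(hidden units)}.

The hidden-unit case is what generalizes the Adaline (delta) rule: each hidden node's error is a weighted sum of the deltas of the nodes it feeds.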
An Efficient Multilayer Quadratic Perceptron for Pattern Classification and Function Approximation
We propose an architecture of a multilayer quadratic perceptron (MLQP) that combines advantages of multilayer perceptrons (MLPs) and higher-order feedforward neural networks. The features of MLQP are its simple structure, practical number of adjustable connection weights, and powerful learning ability. In this paper, the architecture of MLQP is described, a backpropagation lear...
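As a sketch of what a quadratic perceptron node computes, assuming the common MLQP form in which each node combines a squared and a linear term of its inputs before the activation; the cited paper's exact formulation may differ.

    import numpy as np

    # Sketch of a quadratic-perceptron layer: each node mixes squared and
    # linear input terms before the sigmoid. The form (X**2) @ U + X @ V + b
    # is an assumed MLQP formulation, for illustration only.
    def mlqp_layer(X, U, V, b):
        # X: (batch, n_in); U, V: (n_in, n_out); b: (n_out,)
        net = (X ** 2) @ U + X @ V + b
        return 1.0 / (1.0 + np.exp(-net))

    X = np.array([[0.5, -1.0]])
    U = np.ones((2, 1)); V = np.ones((2, 1)); b = np.zeros(1)
    print(mlqp_layer(X, U, V, b))  # single output in (0, 1)

The quadratic terms let a single node carve curved decision boundaries, which is what lets an MLQP match higher-order networks with far fewer weights.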
4. Multilayer perceptrons and back-propagation
Multilayer feed-forward networks, or multilayer perceptrons (MLPs), have one or several "hidden" layers of nodes. This implies that they have two or more layers of weights. The limitations of simple perceptrons do not apply to MLPs. In fact, as we will see later, a network with just one hidden layer can represent any Boolean function (including XOR, which is, as we saw, not linearly separable...
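The XOR claim is easy to check concretely. Below is a hand-weighted one-hidden-layer network of threshold units (the specific weights are illustrative, not from the text): the two hidden nodes compute OR and AND, and the output fires when OR holds but AND does not.

    import numpy as np

    # One hidden layer of threshold units computing XOR with hand-picked
    # weights: hidden node 1 = OR, hidden node 2 = AND,
    # output = OR AND (NOT AND).
    step = lambda z: (z > 0).astype(float)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    H = step(X @ np.array([[1.0, 1.0], [1.0, 1.0]]) - np.array([0.5, 1.5]))
    y = step(H @ np.array([1.0, -1.0]) - 0.5)
    print(y)  # [0. 1. 1. 0.]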
Publication year: 1989